219 research outputs found

    Comparative genomic analysis of Pleurotus species reveals insights into the evolution and coniferous utilization of Pleurotus placentodes

    Pleurotus placentodes (PPL) and Pleurotus cystidiosus (PCY) are economically valuable species. PPL grows on conifers, while PCY grows on broad-leaved trees. To reveal the genetic mechanism behind PPL’s adaptability to conifers, we performed de novo genome sequencing and comparative analysis of PPL and PCY. We determined the genome sizes of PPL and PCY to be 36.12 and 42.74 Mb, respectively, and found that they contain 10,851 and 15,673 protein-coding genes, accounting for 59.34% and 53.70% of their respective genome sizes. Evolutionary analysis showed that PPL is closely related to P. ostreatus, with a divergence time of 62.7 Mya, while PCY is distantly related to the other Pleurotus species, with a divergence time of 111.7 Mya. Comparative analysis of carbohydrate-active enzymes (CAZymes) in PPL and PCY showed that the increased number of CAZymes related to pectin and cellulose degradation (e.g., AA9, PL1) in PPL may be important for the degradation and colonization of conifers. In addition, the geraniol degradation and peroxisome pathways identified by comparative genomics are likely additional factors in PPL’s tolerance of the conifer substrate. Our research provides valuable genomes for Pleurotus species and sheds light on the genetic mechanism of PPL’s conifer adaptability, which could aid in breeding new Pleurotus varieties for coniferous utilization.
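
    To make the CAZyme comparison concrete, the short sketch below contrasts hypothetical family counts from two genome annotations and flags families expanded in one species; the dictionaries, counts, and the min_diff threshold are illustrative placeholders, not values from the paper.

```python
# Minimal sketch (hypothetical counts): comparing CAZyme family sizes between
# two genome annotations to flag families expanded in one species.
ppl_cazymes = {"AA9": 18, "PL1": 9, "GH7": 6, "CE8": 5}   # Pleurotus placentodes (placeholder counts)
pcy_cazymes = {"AA9": 11, "PL1": 4, "GH7": 6, "CE8": 3}   # Pleurotus cystidiosus (placeholder counts)

def expanded_families(a, b, min_diff=2):
    """Return families with at least `min_diff` more members in annotation `a` than in `b`."""
    families = set(a) | set(b)
    return {f: (a.get(f, 0), b.get(f, 0)) for f in families
            if a.get(f, 0) - b.get(f, 0) >= min_diff}

for family, (n_ppl, n_pcy) in sorted(expanded_families(ppl_cazymes, pcy_cazymes).items()):
    print(f"{family}: PPL={n_ppl}, PCY={n_pcy}")
```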

    Deep Learning in Single-Cell Analysis

    Single-cell technologies are revolutionizing the entire field of biology. The large volumes of data generated by single-cell technologies are high-dimensional, sparse, and heterogeneous, and have complicated dependency structures, making analyses using conventional machine learning approaches challenging and impractical. In tackling these challenges, deep learning often demonstrates superior performance compared to traditional machine learning methods. In this work, we give a comprehensive survey of deep learning in single-cell analysis. We first introduce the background of single-cell technologies and their development, as well as fundamental concepts of deep learning, including the most popular deep architectures. We present an overview of the single-cell analytic pipeline pursued in research applications, while noting divergences due to data sources or specific applications. We then review seven popular tasks spanning different stages of the single-cell analysis pipeline, including multimodal integration, imputation, clustering, spatial domain identification, cell-type deconvolution, cell segmentation, and cell-type annotation. Under each task, we describe the most recent developments in classical and deep learning methods and discuss their advantages and disadvantages. Deep learning tools and benchmark datasets are also summarized for each task. Finally, we discuss future directions and the most recent challenges. This survey will serve as a reference for biologists and computer scientists, encouraging collaborations. Comment: 77 pages, 11 figures, 15 tables, deep learning, single-cell analysis
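
    As a generic illustration of the kind of model behind many of the surveyed methods, the sketch below trains a small autoencoder on a toy cells-by-genes matrix and extracts a latent embedding that could feed clustering or visualization; it assumes PyTorch is available and does not reproduce any specific tool from the survey.

```python
# Minimal sketch: an autoencoder producing a low-dimensional embedding of a
# single-cell expression matrix (cells x genes). Toy random data stands in for real counts.
import torch
import torch.nn as nn

n_cells, n_genes, latent_dim = 500, 2000, 32
x = torch.rand(n_cells, n_genes)  # placeholder for a normalized expression matrix

model = nn.Sequential(
    nn.Linear(n_genes, 256), nn.ReLU(),
    nn.Linear(256, latent_dim),          # encoder -> latent embedding
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, n_genes),             # decoder -> reconstruction
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(50):
    recon = model(x)
    loss = nn.functional.mse_loss(recon, x)   # reconstruction loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The encoder half of the network yields a latent embedding usable for clustering.
embedding = model[:3](x).detach()
print(embedding.shape)  # torch.Size([500, 32])
```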

    The state of the Martian climate

    The average annual surface air temperature (SAT) anomaly for 2016 for land stations north of 60°N was +2.0°C, relative to the 1981–2010 average value (Fig. 5.1). This marks a new high for the record starting in 1900, and is a significant increase over the previous highest value of +1.2°C, which was observed in 2007, 2011, and 2015. Average global annual temperatures also showed record values in 2015 and 2016. Currently, the Arctic is warming at more than twice the rate of lower latitudes.
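
    The anomaly itself is simple arithmetic: subtract the 1981–2010 baseline mean from the yearly value. The sketch below illustrates the calculation with made-up station-average temperatures.

```python
# Minimal sketch: an annual surface air temperature anomaly relative to a
# 1981-2010 baseline. The temperatures are illustrative placeholders.
baseline_years = {1981 + i: -8.7 for i in range(30)}   # hypothetical annual means (deg C)
sat_2016 = -6.7                                        # hypothetical 2016 value (deg C)

baseline_mean = sum(baseline_years.values()) / len(baseline_years)
anomaly = sat_2016 - baseline_mean
print(f"2016 anomaly relative to 1981-2010: {anomaly:+.1f} deg C")  # +2.0 deg C
```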

    Search for Neutral Higgs Bosons in Events with Multiple Bottom Quarks at the Tevatron

    The combination of searches performed by the CDF and D0 collaborations at the Fermilab Tevatron Collider for neutral Higgs bosons produced in association with b quarks is reported. The data, corresponding to 2.6 fb-1 of integrated luminosity at CDF and 5.2 fb-1 at D0, have been collected in final states containing three or more b jets. Upper limits are set on the cross section multiplied by the branching ratio, varying between 44 pb and 0.7 pb over the Higgs boson mass range of 90 to 300 GeV, assuming production of a narrow scalar boson. Significant enhancements to the production of Higgs bosons can be found in theories beyond the standard model, for example in supersymmetry. The results are interpreted as upper limits in the parameter space of the minimal supersymmetric standard model in a benchmark scenario favoring this decay mode. Comment: 10 pages, 2 figures
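
    For intuition, a cross section times branching ratio together with an integrated luminosity fixes an expected event yield, N = sigma * BR * L * efficiency (with 1 pb = 1000 fb). The sketch below plugs in the quoted limits and luminosities; the efficiency factor is a made-up placeholder rather than a number from the analysis.

```python
# Minimal sketch: turning a cross-section-times-branching-ratio value into an
# expected event yield for a given integrated luminosity.
def expected_events(sigma_br_pb, lumi_fb_inv, efficiency=1.0):
    """sigma*BR in pb, integrated luminosity in fb^-1 (1 pb = 1000 fb)."""
    return sigma_br_pb * 1000.0 * lumi_fb_inv * efficiency

# Efficiency of 5% is a placeholder, not a number from the analysis.
print(expected_events(0.7, 5.2, efficiency=0.05))   # ~182 events before backgrounds
print(expected_events(44.0, 2.6, efficiency=0.05))  # ~5720 events before backgrounds
```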

    Activated Met Signalling in the Developing Mouse Heart Leads to Cardiac Disease

    BACKGROUND: Hepatocyte Growth Factor (HGF) is a pleiotropic cytokine involved in many physiological processes, including skeletal muscle, placenta and liver development. Little is known about its role, and that of the Met tyrosine kinase receptor, in cardiac development. METHODOLOGY/PRINCIPAL FINDINGS: In this study, we generated two transgenic mice with cardiac-specific, tetracycline-suppressible expression of either Hepatocyte Growth Factor (HGF) or the constitutively activated Tpr-Met kinase to explore: i) the effect of stimulation of the endogenous Met receptor by autocrine production of HGF and ii) the consequences of sustained activation of Met signalling in the heart. We first showed that Met is present in neonatal cardiomyocytes and is responsive to exogenous HGF. Exogenous HGF starting from the prenatal stage enhanced cardiac proliferation and reduced sarcomeric proteins and Connexin43 (Cx43) in newborn mice. As adults, these transgenics developed systolic contractile dysfunction. Conversely, prenatal Tpr-Met expression was lethal after birth. Inducing Tpr-Met expression during postnatal life caused early-onset heart failure, characterized by decreased Cx43, upregulation of fetal genes and hypertrophy. CONCLUSIONS/SIGNIFICANCE: Taken together, our data show that excessive activation of the HGF/Met system in development may result in cardiac damage and suggest that Met signalling may be implicated in the pathogenesis of cardiac disease.

    Measurement of the Bottom-Strange Meson Mixing Phase in the Full CDF Data Set

    We report a measurement of the bottom-strange meson mixing phase \beta_s using the time evolution of B0_s -> J/\psi (-> \mu+ \mu-) \phi (-> K+ K-) decays in which the quark-flavor content of the bottom-strange meson is identified at production. This measurement uses the full data set of proton-antiproton collisions at sqrt(s) = 1.96 TeV collected by the Collider Detector experiment at the Fermilab Tevatron, corresponding to 9.6 fb-1 of integrated luminosity. We report confidence regions in the two-dimensional space of \beta_s and the B0_s decay-width difference \Delta\Gamma_s, and measure \beta_s in [-\pi/2, -1.51] U [-0.06, 0.30] U [1.26, \pi/2] at the 68% confidence level, in agreement with the standard model expectation. Assuming the standard model value of \beta_s, we also determine \Delta\Gamma_s = 0.068 +- 0.026 (stat) +- 0.009 (syst) ps-1 and the mean B0_s lifetime, \tau_s = 1.528 +- 0.019 (stat) +- 0.009 (syst) ps, which are consistent and competitive with determinations by other experiments. Comment: 8 pages, 2 figures, Phys. Rev. Lett. 109, 171802 (2012)
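
    The quoted lifetime and width difference are linked by the standard relations Gamma_s = 1/tau_s, Gamma_L = Gamma_s + DeltaGamma_s/2 and Gamma_H = Gamma_s - DeltaGamma_s/2; the sketch below simply evaluates them at the central values above, without propagating uncertainties.

```python
# Minimal sketch: widths and lifetimes of the light (L) and heavy (H) B0_s
# eigenstates from the quoted central values of tau_s and DeltaGamma_s.
tau_s = 1.528            # ps
delta_gamma_s = 0.068    # ps^-1

gamma_s = 1.0 / tau_s                    # average decay width, ~0.654 ps^-1
gamma_L = gamma_s + delta_gamma_s / 2.0  # light eigenstate width
gamma_H = gamma_s - delta_gamma_s / 2.0  # heavy eigenstate width

print(f"Gamma_s = {gamma_s:.3f} ps^-1")
print(f"tau_L = {1.0 / gamma_L:.3f} ps, tau_H = {1.0 / gamma_H:.3f} ps")
```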

    The Liver Tumor Segmentation Benchmark (LiTS)

    In this work, we report the set-up and results of the Liver Tumor Segmentation Benchmark (LiTS), organized in conjunction with the IEEE International Symposium on Biomedical Imaging (ISBI) 2016 and the International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI) 2017. Twenty-four valid state-of-the-art liver and liver tumor segmentation algorithms were applied to a set of 131 computed tomography (CT) volumes with different types of tumor contrast levels (hyper-/hypo-intense), abnormalities in the tissues (e.g., after metastasectomy), and a varying number and size of lesions. The submitted algorithms were tested on 70 undisclosed volumes. The dataset was created in collaboration with seven hospitals and research institutions and manually reviewed by three independent radiologists. We found that no single algorithm performed best for both liver and tumor segmentation. The best liver segmentation algorithm achieved a Dice score of 0.96 (MICCAI), whereas the best tumor segmentation algorithms achieved Dice scores of 0.67 (ISBI) and 0.70 (MICCAI). The LiTS image data and manual annotations continue to be publicly available through an online evaluation system as an ongoing benchmarking resource. Comment: conference
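
    The Dice score used to rank the algorithms measures the overlap between a predicted and a reference mask; a minimal NumPy sketch on toy binary masks (shapes and values are arbitrary):

```python
# Minimal sketch: the Dice coefficient between two toy binary masks.
import numpy as np

def dice_score(pred, target, eps=1e-7):
    """Dice coefficient between two binary masks (1.0 = perfect overlap)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

pred = np.zeros((64, 64), dtype=np.uint8);  pred[16:48, 16:48] = 1
truth = np.zeros((64, 64), dtype=np.uint8); truth[20:52, 20:52] = 1
print(f"Dice = {dice_score(pred, truth):.3f}")
```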

    Graphene-Based Nanocomposites for Energy Storage

    Since the first report of using the micromechanical cleavage method to produce graphene sheets in 2004, graphene and graphene-based nanocomposites have attracted wide attention, both for fundamental aspects and for applications in advanced energy storage and conversion systems. In comparison to other materials, graphene-based nanostructured materials have a unique 2D structure, high electronic mobility, exceptional electronic and thermal conductivities, excellent optical transmittance, good mechanical strength, and an ultrahigh surface area. Therefore, they are considered attractive materials for hydrogen (H2) storage and high-performance electrochemical energy storage devices, such as supercapacitors, rechargeable lithium (Li)-ion batteries, Li–sulfur batteries, Li–air batteries, sodium (Na)-ion batteries, Na–air batteries, zinc (Zn)–air batteries, and vanadium redox flow batteries (VRFB), as they can improve the efficiency, capacity, gravimetric energy/power densities, and cycle life of these energy storage devices. In this article, recent progress on the synthesis and fabrication of graphene nanocomposite materials for applications in the aforementioned energy storage systems is reviewed. Importantly, the prospects and future challenges in both scalable manufacturing and further energy storage-related applications are discussed.

    Finding New Genes for Non-Syndromic Hearing Loss through an In Silico Prioritization Study

    At present, 51 genes are already known to be responsible for Non-Syndromic hereditary Hearing Loss (NSHL), but the existence of 121 NSHL-linked chromosomal regions leads to the hypothesis that a number of disease genes remain to be uncovered. To help scientists find new NSHL genes, we built a gene-scoring system, integrating the Gene Ontology, NCBI Gene and Map Viewer databases, which prioritizes candidate genes according to their probability of causing NSHL. We defined a set of candidates and measured their functional similarity with respect to the disease gene set, computing a score that relies on the assumption that functionally related genes might contribute to the same (disease) phenotype. A Kolmogorov-Smirnov test, comparing the pair-wise score distribution on the disease gene set with the distribution on the remaining human genes, provided a statistical assessment of this assumption. We found that the former pair-wise distribution is significantly greater than the latter, justifying a prioritization strategy based on the functional similarity of candidate genes with respect to the disease gene set. A cross-validation test measured to what extent the ranking for NSHL differs from a random ordering: adding 15% of the disease genes to the candidate gene set, the ranking of the disease genes within the first eight positions proved statistically different from a hypergeometric (random) distribution. The twenty top-scored genes were then examined to evaluate their possible involvement in NSHL. We found that half of them are known to be expressed in the human inner ear or cochlea and are mainly involved in the remodeling and organization of actin, the formation and maintenance of cilia, and the endocochlear potential. These findings strongly indicate that our metric was able to suggest excellent NSHL candidates to be screened in patients and controls for causative mutations.
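
    A minimal sketch of the general idea, not the paper's exact metric: score each candidate gene by the similarity of its GO annotations to those of known NSHL genes, rank candidates by score, and compare score distributions with a Kolmogorov-Smirnov test. The GO term sets, gene names, and scores below are illustrative placeholders; scipy is assumed available.

```python
# Minimal sketch: GO-based functional similarity scoring of candidate genes
# against a disease gene set, plus a two-sample Kolmogorov-Smirnov test.
from scipy.stats import ks_2samp

def jaccard(a, b):
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical GO annotations for two known NSHL genes and two candidates.
disease_go = {
    "MYO7A": {"GO:0007605", "GO:0030048", "GO:0005902"},
    "GJB2":  {"GO:0007605", "GO:0005922"},
}
candidate_go = {
    "GENE_X": {"GO:0007605", "GO:0005902"},
    "GENE_Y": {"GO:0006412"},
}

def score(gene_terms, disease_sets):
    """Mean similarity of a gene's GO terms to each disease gene's GO terms."""
    sims = [jaccard(gene_terms, terms) for terms in disease_sets.values()]
    return sum(sims) / len(sims)

scores = {g: score(t, disease_go) for g, t in candidate_go.items()}
print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))

# KS test comparing two score distributions (toy values only).
print(ks_2samp(list(scores.values()), [0.0, 0.1, 0.05]))
```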

    Evaluation of individual and ensemble probabilistic forecasts of COVID-19 mortality in the United States

    Short-term probabilistic forecasts of the trajectory of the COVID-19 pandemic in the United States have served as a visible and important communication channel between the scientific modeling community and both the general public and decision-makers. Forecasting models provide specific, quantitative, and evaluable predictions that inform short-term decisions such as healthcare staffing needs, school closures, and allocation of medical supplies. Starting in April 2020, the US COVID-19 Forecast Hub (https://covid19forecasthub.org/) collected, disseminated, and synthesized tens of millions of specific predictions from more than 90 different academic, industry, and independent research groups. A multimodel ensemble forecast that combined predictions from dozens of groups every week provided the most consistently accurate probabilistic forecasts of incident deaths due to COVID-19 at the state and national level from April 2020 through October 2021. The performance of 27 individual models that submitted complete forecasts of COVID-19 deaths consistently throughout this period showed high variability in forecast skill across time, geospatial units, and forecast horizons. Two-thirds of the models evaluated showed better accuracy than a naïve baseline model. Forecast accuracy degraded as models made predictions further into the future, with probabilistic error at a 20-wk horizon three to five times larger than when predicting at a 1-wk horizon. This project underscores the role that collaboration and active coordination between governmental public-health agencies, academic modeling teams, and industry partners can play in developing modern modeling capabilities to support local, state, and federal response to outbreaks.
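
    One simple way to build such a multi-model ensemble is to take the median of each predictive quantile across models; the sketch below does exactly that on toy quantile forecasts and is not a reproduction of the Forecast Hub's exact procedure.

```python
# Minimal sketch: a per-quantile median ensemble of probabilistic forecasts.
import numpy as np

quantile_levels = [0.05, 0.25, 0.5, 0.75, 0.95]
# Rows: models; columns: forecast quantiles of weekly incident deaths (toy numbers).
model_forecasts = np.array([
    [120, 180, 240, 310, 430],
    [ 90, 160, 220, 280, 390],
    [150, 210, 260, 330, 480],
])

ensemble = np.median(model_forecasts, axis=0)
for q, value in zip(quantile_levels, ensemble):
    print(f"q={q:.2f}: {value:.0f} deaths")
```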